Composite Likelihood Methods Based on Minimum Density Power Divergence Estimator

Authors

  • Elena Castilla
  • Nirian Martín
  • Leandro Pardo
  • Konstantinos Zografos
Abstract

In this paper, a robust version of the Wald test statistic for composite likelihood is considered by using the composite minimum density power divergence estimator instead of the composite maximum likelihood estimator. This new family of test statistics will be called Wald-type test statistics. The problems of testing a simple null hypothesis and a composite null hypothesis are considered, and the robustness is studied on the basis of a simulation study. Beforehand, the composite minimum density power divergence estimator is introduced and its asymptotic properties are studied.
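For orientation, and not quoted from the abstract itself: the minimum density power divergence estimator of Basu et al. (1998), with tuning parameter $\alpha > 0$, minimizes the empirical objective

$$
H_{n,\alpha}(\theta)\;=\;\int f_{\theta}(y)^{1+\alpha}\,dy\;-\;\Bigl(1+\tfrac{1}{\alpha}\Bigr)\frac{1}{n}\sum_{i=1}^{n} f_{\theta}(y_i)^{\alpha}.
$$

The composite variant studied in the paper presumably replaces the full model density $f_{\theta}$ by a composite likelihood built from low-dimensional marginal or conditional blocks, e.g. $\mathcal{CL}(\theta;y)=\prod_{k=1}^{K} f_{k}(y;\theta)^{w_k}$ with nonnegative weights $w_k$ (the precise blocks and weights are specified in the full text). For the simple null hypothesis $H_0\colon\theta=\theta_0$, a Wald-type statistic then takes the generic form $W_n = n\,(\hat\theta_\alpha-\theta_0)^{\top}\hat\Sigma_{\alpha}^{-1}(\hat\theta_\alpha-\theta_0)$, where $\hat\theta_\alpha$ is the composite minimum density power divergence estimator and $\hat\Sigma_{\alpha}$ is a consistent estimator of its asymptotic (sandwich-type) covariance matrix.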


Similar Resources

Robust Estimation in Linear Regression Model: the Density Power Divergence Approach

The minimum density power divergence method provides robust estimates when the dataset contains outlying observations. In this study, we introduce and use a robust minimum density power divergence estimator to estimate the parameters of the linear regression model, and then, through some numerical examples of the linear regression model, we show the robustness of this est...
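Below is a minimal, self-contained sketch (my own, not the authors' code) of a minimum density power divergence fit for a normal-error linear regression model. The function names `dpd_objective` and `mdpde_linear_regression`, the choice $\alpha = 0.5$, and the toy data are illustrative assumptions; the closed-form integral term is specific to the Gaussian error density.

```python
import numpy as np
from scipy.optimize import minimize

def dpd_objective(params, X, y, alpha):
    """Empirical density power divergence for a normal-error linear model
    y = X @ beta + eps,  eps ~ N(0, sigma^2).
    params = (beta_0, ..., beta_p, log_sigma); alpha > 0 tunes robustness."""
    beta, log_sigma = params[:-1], params[-1]
    sigma = np.exp(log_sigma)                       # keeps sigma > 0
    resid = y - X @ beta
    norm_const = (2.0 * np.pi) ** (alpha / 2.0) * sigma ** alpha
    # closed-form integral of f_theta^(1+alpha) for the Gaussian density
    integral_term = 1.0 / (norm_const * np.sqrt(1.0 + alpha))
    # empirical term: (1 + 1/alpha) * average of f_theta(y_i)^alpha
    density_alpha = np.exp(-alpha * resid**2 / (2.0 * sigma**2)) / norm_const
    return integral_term - (1.0 + 1.0 / alpha) * density_alpha.mean()

def mdpde_linear_regression(X, y, alpha=0.5):
    """Minimum density power divergence estimates of (beta, sigma)."""
    beta0, *_ = np.linalg.lstsq(X, y, rcond=None)   # OLS as starting value
    sigma0 = np.std(y - X @ beta0)
    start = np.concatenate([beta0, [np.log(sigma0)]])
    res = minimize(dpd_objective, start, args=(X, y, alpha), method="Nelder-Mead")
    return res.x[:-1], np.exp(res.x[-1])

# Toy example: clean linear data with a small fraction of gross outliers.
rng = np.random.default_rng(0)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([1.0, 2.0]) + rng.normal(scale=0.5, size=n)
y[:10] += 20.0                                      # contaminate 5% of responses
beta_hat, sigma_hat = mdpde_linear_regression(X, y, alpha=0.5)
print("MDPDE beta:", beta_hat, "sigma:", sigma_hat)
```

Unlike ordinary least squares, which the contaminated responses pull sharply upward, the downweighting induced by $\alpha > 0$ keeps the slope and intercept close to the values used to generate the clean part of the data.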


Minimum density power divergence estimator for diffusion processes

In this paper, we consider robust estimation for a certain class of diffusion processes, including the Ornstein–Uhlenbeck process, based on discrete observations. As a robust estimator, we consider the minimum density power divergence estimator (MDPDE) proposed by Basu et al. (Biometrika 85:549–559, 1998). It is shown that the MDPDE is consistent and asymptotically normal. A simulation study ...
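For concreteness (standard background on the Ornstein–Uhlenbeck process, not quoted from this entry): for $dX_t=\theta(\mu-X_t)\,dt+\sigma\,dW_t$ with $\theta>0$, the transition distribution over a sampling interval $\Delta$ is exactly Gaussian,

$$
X_{t+\Delta}\mid X_t=x\;\sim\;N\!\Bigl(\mu+(x-\mu)e^{-\theta\Delta},\;\frac{\sigma^2}{2\theta}\bigl(1-e^{-2\theta\Delta}\bigr)\Bigr),
$$

so one natural way to build a minimum density power divergence estimator from discrete observations $X_{0},X_{\Delta},\dots,X_{n\Delta}$ is to plug this transition density into the density power divergence objective, with consecutive pairs $(X_{(i-1)\Delta},X_{i\Delta})$ playing the role of the data; whether this matches the exact construction of the cited paper cannot be determined from the truncated abstract.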


A note on the asymptotic distribution of the minimum density power divergence estimator

Basu et al. [1] and [2] introduce the minimum density power divergence estimator (MDPDE) as a parametric estimator that balances infinitesimal robustness and asymptotic efficiency. The MDPDE depends on a tuning constant α ≥ 0 that controls this trade-off. For α = 0 the MDPDE becomes the maximum likelihood estimator, which under certain regularity conditions is asymptotically efficient, see chap...
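A short expansion (my own, under standard regularity conditions) makes the $\alpha=0$ limit explicit. Using $f_\theta(y_i)^{\alpha}=1+\alpha\log f_\theta(y_i)+O(\alpha^{2})$ and $\int f_\theta^{1+\alpha}\,dy=1+\alpha\int f_\theta\log f_\theta\,dy+O(\alpha^{2})$, the density power divergence objective satisfies

$$
H_{n,\alpha}(\theta)\;=\;-\frac{1}{\alpha}\;-\;\frac{1}{n}\sum_{i=1}^{n}\log f_{\theta}(y_i)\;+\;O(\alpha),
$$

where the $-1/\alpha$ term does not depend on $\theta$. Hence, as $\alpha\to 0$, minimizing the objective is equivalent to maximizing the average log-likelihood, which is why the MDPDE reduces to the maximum likelihood estimator at $\alpha=0$ and trades efficiency for robustness as $\alpha$ grows.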


Inference on Pr(X > Y) Based on Record Values From the Power Hazard Rate Distribution

In this article, we consider the problem of estimating the stress-strength reliability $Pr (X > Y)$ based on upper record values when $X$ and $Y$ are two independent but not identically distributed random variables from the power hazard rate distribution with common scale parameter $k$. When the parameter $k$ is known, the maximum likelihood estimator (MLE), the approximate Bayes estimator and ...
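As general background (not specific to the record-value setting of this entry), for independent nonnegative random variables the stress-strength reliability can be written as

$$
R=\Pr(X>Y)=\int_{0}^{\infty}\Pr(X>y)\,f_Y(y)\,dy=\int_{0}^{\infty}\bigl(1-F_X(y)\bigr)\,f_Y(y)\,dy,
$$

and point or Bayes estimators of $R$ are typically obtained by evaluating this expression at parameter estimates, here computed from the upper record values rather than from a complete sample.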


Minimum Φ-divergence Estimator and Hierarchical Testing in Loglinear Models

In this paper we consider inference based on very general divergence measures, under assumptions of multinomial sampling and loglinear models. We define the minimum φ-divergence estimator, which is seen to be a generalization of the maximum likelihood estimator. This estimator is then used in a φ-divergence goodness-of-fit statistic, which is the basis of two new statistics for solving the prob...
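For reference (the standard definition, not quoted from this entry): under multinomial sampling with observed relative frequencies $\hat p=(\hat p_1,\dots,\hat p_k)$ and model probabilities $p(\theta)=(p_1(\theta),\dots,p_k(\theta))$, the $\phi$-divergence is

$$
D_{\phi}\bigl(\hat p,\,p(\theta)\bigr)=\sum_{j=1}^{k}p_j(\theta)\,\phi\!\Bigl(\frac{\hat p_j}{p_j(\theta)}\Bigr),\qquad \phi\ \text{convex},\ \phi(1)=0,
$$

and the minimum $\phi$-divergence estimator is $\hat\theta_{\phi}=\arg\min_{\theta}D_{\phi}(\hat p,p(\theta))$. Choosing $\phi(x)=x\log x-x+1$ gives the Kullback-Leibler divergence, for which the minimizer coincides with the maximum likelihood estimator; this is the sense in which the estimator generalizes maximum likelihood.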



Journal:
  • Entropy

Volume 20, Issue

Pages -

Publication date: 2018